TF: Unpack model inputs through a decorator #15907
gante merged 10 commits into huggingface:master from
Conversation
The documentation is not available anymore as the PR was closed or merged.
No more need for this ugly work-around for non-NLP models 😎
sgugger
left a comment
So nice and elegant! Love how it makes the modeling code so much more readable!
This looks great! Also pinging @patrickvonplaten for review, as it's quite central.
Thanks for running the tests on it. It is indeed much more readable like that.
Looks very nice! Thanks for cleaning this up |
It seems like everyone is in agreement :) @Rocketknight1, can you have a final look and approve if you agree with the change?
Rocketknight1
left a comment
This looks great to me! Is the plan to wait and see how things go with BERT and, assuming no problems, repeat those edits across all model classes?
Yeah! I'm also going to open an issue with the
What does this PR do?
Unpacks TF model inputs through a decorator, improving code clarity. To be read together with issue #15908, which describes the problem, the proposed solution, and the future plan.
Closes #15908
How to review: start with modeling_tf_utils.py, and then check the changes in the models. I've run RUN_SLOW=1 py.test -vv tests/model_name/test_modeling_tf_model_name.py for all involved models, and the changes were applied to BERT (plus its copy-linked architectures) and Speech2Text, showing that it works for multiple modalities.
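The decorator idea can be sketched roughly as below. This is a simplified, framework-free approximation for illustration only: the real unpack_inputs in modeling_tf_utils.py handles many more TF-specific cases (tuples, tensors, Keras symbolic inputs, etc.), and the ToyModel class here is purely hypothetical.

```python
import functools
import inspect

def unpack_inputs(fn):
    """Sketch of an input-unpacking decorator (not the actual transformers
    implementation): if the first positional argument to a model's ``call``
    is a dict, unpack it into keyword arguments so the body only ever sees
    plain, named arguments."""
    @functools.wraps(fn)
    def wrapper(self, *args, **kwargs):
        if args and isinstance(args[0], dict):
            packed, rest = args[0], args[1:]
            accepted = inspect.signature(fn).parameters
            # Forward only the keys the wrapped function actually accepts
            unpacked = {k: v for k, v in packed.items() if k in accepted}
            return fn(self, *rest, **unpacked, **kwargs)
        return fn(self, *args, **kwargs)
    return wrapper

class ToyModel:  # hypothetical stand-in for a TF model class
    @unpack_inputs
    def call(self, input_ids=None, attention_mask=None, training=False):
        # The body no longer needs manual "is the input a dict?" plumbing
        return {"input_ids": input_ids, "attention_mask": attention_mask}

model = ToyModel()
# Dict-style calls (as Keras often passes inputs) and keyword-style calls
# both reach the body in the same normalized form.
out = model.call({"input_ids": [1, 2, 3], "attention_mask": [1, 1, 1]})
```

The payoff described in the PR is exactly this normalization: every model's call body can drop its repeated input-parsing boilerplate and rely on keyword arguments being populated.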